Annals of Emerging Technologies in Computing (AETiC)

 
Table of Contents

         Table of Contents (Volume #10, Issue #2)


 
Cover Page

         Cover Page (Volume #10, Issue #2)


 
Editorial

         Editorial (Volume #10, Issue #2)


 
Paper #1                                                                             

Advanced Machine Learning and Deep Learning Techniques for Enhanced Cattle Identification and Detection: A Comprehensive Review

Fayazunnesa Chowdhury, Syed Md. Galib, Md Nasim Adnan, Md. Moradul Siddique, Md Robiul Karim and K M Tanvir Anjum


Abstract: The need for effective cattle identification technology is now more acutely felt than ever in maintaining biosecurity, food safety, and supply chain efficacy in livestock management. This paper presents a systematic review of recent research in cattle identification using machine learning and deep learning techniques. The review measures the effectiveness of traditional and modern cattle identification techniques using studies drawn from major academic databases, where articles were subjected to full-text review. Among these techniques, classical machine learning methods such as K-Nearest Neighbors and Support Vector Machines have demonstrated good results in cattle identification; however, deep learning techniques, such as Convolutional Neural Networks, Residual Networks, and You Only Look Once, perform better in recognition, detection, and identification tasks. Feature extraction relies on common techniques such as Local Binary Pattern (LBP), Speeded-Up Robust Features (SURF), and Scale-Invariant Feature Transform (SIFT), while key features commonly used in these studies include muzzle prints and coat patterns. The review highlights key hurdles in cattle identification, such as the limited number of publicly accessible datasets, data-quality issues arising from environmental changes and animal mobility, and the demand for real-time processing capability. The paper aims to inform researchers, policymakers, and stakeholders about implementing scalable, humane, and effective cattle identification systems to achieve sustainable livestock management.
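As a concrete illustration of one of the feature-extraction techniques surveyed, here is a minimal NumPy sketch of the 8-neighbour Local Binary Pattern (LBP) descriptor; it is an editorial example, not code from any of the reviewed studies:

```python
import numpy as np

def lbp_codes(img):
    """8-neighbour Local Binary Pattern code for each interior pixel."""
    h, w = img.shape
    c = img[1:-1, 1:-1]  # centre pixels (image border excluded)
    # clockwise neighbour offsets; each contributes one bit of the code
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        n = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]  # shifted neighbour view
        code += (n >= c).astype(np.int32) << bit  # set bit if neighbour >= centre
    return code
```

A histogram of these codes over an image patch (e.g. a muzzle-print region) is the texture feature typically fed to a classical classifier such as KNN or an SVM.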


Keywords: Cattle Detection; Cattle Identification; Deep Learning; Livestock Management; Machine Learning.


Download Full Text


 
Paper #2                                                                             

Enhanced Database Security using Hybrid GA-PSO for Parallel Elliptic Curve Cryptographic Scheduling with Offline Optimisation

Safaa Salam Hatem and Fahad Naim Nife


Abstract: Modern database systems increasingly employ cryptographic primitives; however, the significant computational overhead of encryption and decryption represents a major performance bottleneck in high-throughput environments. In particular, Elliptic Curve Cryptography (ECC) is of interest because it provides strong security guarantees with compact key sizes and relatively efficient arithmetic, making it suitable for resource-constrained platforms and latency-sensitive services. Realistic database workloads, such as batch encryption pipelines, parallel client authentication, and transaction-intensive applications, require the concurrent execution of many ECC scalar multiplications, inducing a non-trivial scheduling problem on modern multi-core and heterogeneous architectures. This work introduces a hybrid Genetic Algorithm-Particle Swarm Optimisation (GA-PSO) framework that acts as an offline configuration mechanism for optimising the parallel execution of ECC operations on heterogeneous resources. The framework runs in a pre-computation phase: the optimisation routine is executed only once, during system initialisation or when the database is deployed on new hardware, and the resulting scheduling parameters are cached and reused for the system's operational lifetime. In this way, the optimisation cost is amortised over millions of subsequent cryptographic operations, reconciling the apparent trade-off between sophisticated scheduling strategies and the stringent performance constraints of real-time database workloads. The scheduler operates over a four-dimensional decision space (ECC window size, core assignment, run ordering, and memory placement) and uses a five-dimensional score function that considers runtime, memory footprint, load balancing, energy consumption, and schedule predictability.
Experimental evaluation on the widely deployed secp256r1 curve shows that the offline GA-PSO scheduler reduces execution time by 4.2% compared to the dynamic Work-Stealing baseline, by 9.8% compared to Greedy SJF, and by 24.5% compared to Round-Robin. Given an offline optimisation cost of 0.85 s and a per-batch saving of 9.5 ms relative to the strongest runtime baseline, this overhead is amortised after approximately 4,500 scalar multiplications. On the target multi-core platform, the proposed scheduler converges most frequently to an ECC window size of 8, as this choice offers the best trade-off between precomputation overhead and scalar multiplication speed. Overall, our results indicate that offline meta-heuristic scheduling is a viable and efficient building block for smart, crypto-aware resource management in secure high-performance database systems.
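The amortisation claim can be sanity-checked with quick arithmetic from the figures quoted above; note that the per-batch multiplication count below is an inference from those figures, not a number stated in the abstract:

```python
# figures reported in the abstract
offline_cost_s = 0.85        # one-off GA-PSO optimisation cost
saving_per_batch_s = 9.5e-3  # saving per batch vs. the strongest runtime baseline

# batches needed before the offline cost is paid back
breakeven_batches = offline_cost_s / saving_per_batch_s  # ~89.5 batches

# the abstract reports break-even after ~4,500 scalar multiplications,
# which would imply roughly 4500 / 89.5, i.e. about 50 multiplications per batch
implied_mults_per_batch = 4500 / breakeven_batches
```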


Keywords: Cryptographic scheduling; Database security; Elliptic Curve Cryptography (ECC); Genetic Algorithm (GA); Multi-core optimisation; Offline optimisation; Particle Swarm Optimisation (PSO); Resource management.


Download Full Text

 
Paper #3                                                                             

Understanding the Traffic Sign through a Deep CNN Architecture

Naima Islam, Sajeeb Kumar Ray, Md Mynoddin, Md. Tofael Ahmed, Md. Zahid Hasan, Sheikh Mohammad Jawad and Md. Anwar Hossain


Abstract: Accurate traffic sign classification is essential for intelligent transportation systems and automated vehicles to ensure safety and navigation efficiency. Our research showcases a specialized CNN model focused on achieving precision in traffic sign detection, utilizing the GTSRB dataset for training. The model incorporates advanced methodologies, including skip connections and bilinear interpolation, to mitigate issues like image noise and low resolution. Skip connections preserve vital features across layers to prevent information loss during training, while bilinear interpolation enhances image clarity for improved recognition under various real-world conditions. The architecture consists of multiple convolutional and pooling layers optimized for extracting and maintaining detailed features crucial for accurate classification. With these improvements, the suggested model achieves a remarkable accuracy rate of 99.78%, indicating its competence in identifying traffic signs. Additionally, the model was evaluated on the TT-100K dataset and the Traffic Sign Classification dataset, with accuracies of 99.78% and 98.86%, respectively. This precision showcases the strength and flexibility of the model, confirming its reliability in the development of advanced transportation technologies, while highlighting its practical usefulness in real-world traffic monitoring systems. By enhancing the reliability of traffic sign recognition systems, this research addresses current shortcomings and significantly improves the safety of automated vehicles and road users. The results emphasize the potential of innovative computer vision techniques in traffic sign classification, fostering progress in intelligent transportation systems and automated driving technologies. This investigation represents a crucial advancement toward bridging theoretical research and practical implementation, thereby improving the reliability and safety of contemporary transportation networks.
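To illustrate the two mechanisms named above, here is a minimal NumPy sketch of bilinear interpolation combined with an additive skip connection; it is an editorial illustration of the general idea, not the authors' architecture:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D array with bilinear interpolation."""
    in_h, in_w = img.shape
    # map each output pixel to fractional coordinates in the input grid
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]  # vertical blend weights
    wx = (xs - x0)[None, :]  # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# a skip connection adds an earlier feature map to a later one,
# resizing it first so that the shapes match
def skip_add(early, late):
    return late + bilinear_resize(early, *late.shape)
```

In the CNN setting the same operations act per channel; the addition preserves low-level detail that would otherwise be lost through successive convolution and pooling layers.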


Keywords: Computer Vision; Convolutional neural network; GTSRB; Image Processing; Intelligent transportation systems; Traffic sign.


Download Full Text


 
Paper #4                                                                             

Effective Control and Management of Pump Station Electrical Equipment

Yerzhan Abdykenov, Karshiga Smagulova, Abror Pulatov, Shuxrat Umarov and Aisaule Zheldikbaeva


Abstract: This study presents an integrated methodological framework for optimizing the control and management of electrical power equipment in municipal pump stations through the combined application of SCADA architecture and adaptive variable frequency drive (VFD) regulation. Unlike conventional approaches based on fixed operational set-points or isolated equipment-level upgrades, the proposed framework enables system-level evaluation of energy efficiency, hydraulic stability, operational reliability, and fault response performance. A mathematical model of pump station operation was developed in MATLAB/Simulink, incorporating key variables such as energy consumption, pipeline pressure, volumetric flow rate, shutdown frequency, and fault recovery time. The framework was validated using real operational data from three municipal pump stations in Taraz, Kazakhstan, representing heterogeneous infrastructure conditions. The results demonstrate a reduction in electrical energy consumption by 15-20%, a decrease in pump shutdown frequency by 50-70%, and a reduction in mean fault recovery time from 4-6 hours to 1-2 hours. In addition, the system reliability coefficient increased from 0.75-0.89 to 0.95-0.98. The simulation results showed strong agreement with field data, with model prediction errors not exceeding 6-8% for key operational parameters. Economic analysis indicates a payback period of approximately 2.2-2.5 years following modernization. The proposed framework provides a transferable decision-support methodology for evaluating and implementing energy-efficient and reliability-oriented control strategies in municipal water supply systems, particularly for aging infrastructure operating under variable hydraulic conditions.


Keywords: Automation; Electrical power engineering; Energy efficiency; Frequency converters; Pump stations; SCADA systems; Water resource management.


Download Full Text


 
Paper #5                                                                             

Heterogeneous IoT User Association and Channel Resource Joint Scheduling Method Based on MADDQN and DMADDPG

Yankai Xie


Abstract: The large-scale access of multiple types of terminals in the heterogeneous IoT makes it difficult to balance system performance and scalability, owing to the strong coupling between user association and channel resource scheduling. Existing deep reinforcement learning methods still have shortcomings in multi-agent collaborative decision-making, hybrid discrete-continuous action modeling, and adaptability to dynamic environments. This study therefore proposes a joint scheduling method for user association and channel resources based on MADDQN and DMADDPG. The method decouples discrete user scheduling from continuous resource allocation, and achieves multi-agent collaborative optimization under a centralized training and distributed execution framework. The results showed that the total throughput of the proposed method reached 468.52 Mbps, which was 116.42 Mbps higher than the 352.10 Mbps achieved by the weighted minimum mean square error (WMMSE) baseline. When the user scale was expanded to 200, the non-compliance rate of the proposed method was only 16.82%, which was 32.30% lower than that of the DMADDPG algorithm, while still maintaining an average rate of 2.48 Mbps. In summary, the proposed method offers good robustness and scalability while improving system performance, and provides an effective solution for large-scale heterogeneous IoT resource scheduling.
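The double deep Q-network component mentioned in the keywords can be summarised by its target computation, in which the online network selects the next action and the target network evaluates it; the following is a generic sketch of that rule, not the paper's MADDQN implementation:

```python
import numpy as np

def double_dqn_target(reward, q_online_next, q_target_next, gamma=0.99):
    """One-step Double DQN target: decouple action selection from evaluation."""
    a_star = int(np.argmax(q_online_next))         # online net picks the action
    return reward + gamma * q_target_next[a_star]  # target net scores it
```

Decoupling selection from evaluation reduces the overestimation bias of vanilla Q-learning, which matters when many agents learn concurrently in a shared environment.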


Keywords: Channel resource allocation; Double deep Q-network; Heterogeneous Internet of Things; Multi-agent reinforcement learning; User association.


Download Full Text

 
 International Association for Educators and Researchers (IAER), registered in England and Wales - Reg #OC418009                        Copyright IAER 2026